Recurrent Neural Networks with Iterated Function Systems Dynamics
Authors
Abstract
We propose a recurrent neural network (RNN) model whose recurrent part corresponds to iterated function systems (IFS), introduced by Barnsley [1] as a fractal image compression mechanism. The key ideas are that 1) the model avoids learning the RNN state part by using non-trainable connections between the context and recurrent layers, which makes training less problematic and faster, and 2) the RNN state part codes the information-processing states in the symbolic input stream in a well-organized and intuitively appealing way. We show that there is a direct correspondence between the Rényi entropy spectra characterizing the input stream and the spectra of Rényi generalized dimensions of activations inside the RNN state space. We test both the new RNN model with IFS dynamics and its conventional counterpart with a trainable recurrent part on two chaotic symbolic sequences. In our experiments, RNNs with IFS dynamics outperform the conventional RNNs with respect to information-theoretic measures computed on the training and model-generated sequences.
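To make the non-trainable recurrent part concrete, here is a minimal sketch (not the paper's exact architecture) of an IFS-driven RNN state: each input symbol applies a fixed affine contraction toward a symbol-specific target point, so the state becomes a fractal encoding of the recent symbol history. The contraction rate `k`, the 2-D state space, and the corner targets are illustrative assumptions, not values from the paper.

```python
def ifs_state_sequence(symbols, k=0.5):
    """Map a symbol stream to IFS states in [0, 1]^2.

    Each symbol s applies the fixed (non-trainable) contraction
        x_{t+1} = k * x_t + (1 - k) * t_s,
    where t_s is a corner of the unit square and 0 < k < 1.
    """
    targets = {          # one corner per symbol; chosen for illustration
        "a": (0.0, 0.0),
        "b": (1.0, 0.0),
        "c": (0.0, 1.0),
        "d": (1.0, 1.0),
    }
    x, y = 0.5, 0.5      # start in the centre of the state space
    states = []
    for s in symbols:
        tx, ty = targets[s]
        x = k * x + (1 - k) * tx
        y = k * y + (1 - k) * ty
        states.append((x, y))
    return states

# Sequences sharing a recent suffix land close together in state space,
# so a trainable readout on the state can learn next-symbol statistics.
s1 = ifs_state_sequence("abab")
s2 = ifs_state_sequence("ccab")   # different prefix, same suffix "ab"
d = max(abs(u - v) for u, v in zip(s1[-1], s2[-1]))
print(d < 0.25)  # True: states cluster by recent history
```

Because the contractions are fixed, only the output readout needs training, which is the source of the speed-up the abstract mentions; the self-similar geometry of the resulting state set is what links the input stream's Rényi entropy spectra to generalized dimensions of the activations.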
Similar works
A Gradient Descent Method for a Neural Fractal Memory
It has been demonstrated that higher-order recurrent neural networks exhibit an underlying fractal attractor as an artifact of their dynamics. These fractal attractors offer a very efficient mechanism to encode visual memories in a neural substrate, since even a simple twelve-weight network can encode a very large set of different images. The main problem in this memory model, which so far has remai...
Adaptive Leader-Following and Leaderless Consensus of a Class of Nonlinear Systems Using Neural Networks
This paper deals with leader-following and leaderless consensus problems of high-order multi-input/multi-output (MIMO) multi-agent systems with unknown nonlinear dynamics in the presence of uncertain external disturbances. The agents may have different dynamics and communicate together under a directed graph. A distributed adaptive method is designed for both cases. The structures of the contro...
Recurrent Networks: State Machines Or Iterated Function Systems?
words, the recurrent network states are not IP states in and of themselves; they require an appropriate context which can elevate them to IP-hood. This context consists of a set of input sequences and an observation method for generating outputs. While the recurrent network's state dynamics may be described as an IFS, any IP interpretation will involve a holistic combination of the set of possible ...
Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays
In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, which are represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
Performance Analysis of a New Neural Network for Routing in Mesh Interconnection Networks
Routing is one of the basic parts of a message passing multiprocessor system. The routing procedure has a great impact on the efficiency of a system. Neural algorithms that are currently in use for computer networks require a large number of neurons. If a specific topology of a multiprocessor network is considered, the number of neurons can be reduced. In this paper a new recurrent neural ne...